Solving Stochastic Compositional Optimization is Nearly as Easy as Solving Stochastic Optimization

Authors

Abstract

Stochastic compositional optimization generalizes classic (non-compositional) stochastic optimization to the minimization of compositions of functions. Each composition may introduce an additional expectation, and the series of expectations may be nested. Stochastic compositional optimization is gaining popularity in applications such as reinforcement learning and meta learning. This paper presents a new Stochastically Corrected Stochastic Compositional gradient method (SCSC). SCSC runs on a single time scale with a single loop, uses a fixed batch size, and is guaranteed to converge at the same rate as stochastic gradient descent (SGD) for non-compositional stochastic optimization. This is achieved by making a careful improvement to a popular stochastic compositional gradient method. It is easy to apply SGD-improvement techniques to accelerate SCSC, which helps SCSC achieve state-of-the-art performance for stochastic compositional optimization. In particular, we apply Adam to SCSC, and the exhibited rate of convergence matches that of the original Adam on non-compositional stochastic optimization. We test SCSC using model-agnostic meta-learning tasks.
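The abstract describes the core idea only at a high level: keep a running estimate of the inner expectation and apply a stochastic correction to it each iteration. The sketch below illustrates one such corrected compositional gradient update on a toy problem; the specific problem, step sizes, and update form are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

# Toy compositional problem (an assumed example, not from the paper):
#   minimize F(x) = f(E_phi[g(x; phi)])  with
#   g(x; phi) = x + phi, phi ~ N(0, 1)   (inner stochastic map, E[g] = x)
#   f(y) = 0.5 * y**2                    (outer function)
# so F(x) = 0.5 * x**2 and the minimizer is x* = 0.

rng = np.random.default_rng(0)

def g(x, phi):        # one stochastic sample of the inner function
    return x + phi

def grad_g(x, phi):   # d g / d x (constant here)
    return 1.0

def grad_f(y):        # d f / d y
    return y

alpha, beta = 0.1, 0.5   # step size and correction weight (assumed values)
x = 5.0
y = g(x, rng.normal())   # tracking variable estimating E[g(x)]

for k in range(2000):
    phi = rng.normal()
    # chain-rule gradient step using the tracked inner value y
    x_new = x - alpha * grad_g(x, phi) * grad_f(y)
    # stochastically corrected tracking update: blend the drift-corrected
    # old estimate with a fresh sample of g at the new iterate
    y = (1 - beta) * (y + g(x_new, phi) - g(x, phi)) + beta * g(x_new, phi)
    x = x_new

print(x)  # should settle near the minimizer x* = 0, up to sampling noise
```

A plain compositional SGD step would instead plug a single noisy sample of g directly into grad_f; the correction term above is what lets the tracking variable follow the moving iterate without the two-time-scale step sizes older methods require.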


Similar Articles

Stochastic Optimization of Querying Distributed Databases Ii. Solving Stochastic Optimization

The general stochastic query optimization (GSQO) problem for a multiple join — the join of p relations stored at p different sites — is presented. The GSQO problem leads to a special kind of nonlinear programming problem (P). Problem (P) is solved using a constructive method: a sequence converging to the solution of the optimization problem is built. Two algorithms for solving optimization pro...


Solving Random Quadratic Systems of Equations Is Nearly as Easy as Solving Linear Systems

We consider the fundamental problem of solving quadratic systems of equations in n variables, where y_i = |⟨a_i, x⟩|², i = 1, . . . , m, and x ∈ Rⁿ is unknown. We propose a novel method, which starting with an initial guess computed by means of a spectral method, proceeds by minimizing a nonconvex functional as in the Wirtinger flow approach [11]. There are several key distinguishing features, most no...


Solving nearly-separable quadratic optimization problems as nonsmooth equations

An algorithm for solving nearly-separable quadratic optimization problems (QPs) is presented. The approach is based on applying a semismooth Newton method to solve the implicit complementarity problem arising as the first-order stationarity conditions of such a QP. An important feature of the approach is that, as in dual decomposition methods, separability of the dual function of the QP can be ...


Design Is as Easy as Optimization

We consider the class of max-min and min-max optimization problems subject to a global budget constraint. We undertake a systematic algorithmic and complexity-theoretic study of such problems, which we call design problems. Every optimization problem leads to a natural design problem. Our main result uses techniques of Freund-Schapire [FS99] from learning theory, and its generalization...


Solving single facility goal Weber location problem using stochastic optimization methods

Location theory is one of the most important topics in optimization and operations research. In location problems, the goal is to find the location of one or more facilities in a way such that some criteria such as transportation costs, customer traveling distance, total service time, and cost of servicing are optimized. In this paper, we investigate the goal Weber location problem in which the...



Journal

Journal Title: IEEE Transactions on Signal Processing

Year: 2021

ISSN: 1053-587X, 1941-0476

DOI: https://doi.org/10.1109/tsp.2021.3092377